Probabilistic Spatiotemporal Modeling of Day-Ahead Wind Power Generation with Input-Warped Gaussian Processes

Li, Qiqi, Ludkovski, Mike

arXiv.org Artificial Intelligence

Wind power is one of the fastest-growing renewable energy sectors and a key pillar for the transition to a carbon-free economy. In 2023, energy from wind accounted for 10.2% of all U.S. utility-scale electricity generation [54]. Being intrinsically weather-driven, wind power injects uncertainty into the balancing of power demand and generation. On the daily operational time scale, quantifying the asset-specific and area-wide uncertainty of renewable generation for the next day is an essential ingredient of grid management. Specifically, grid operators need probabilistic spatiotemporal forecasting of wind power in order to appropriately set grid reserves, ensure grid stability, and optimize dispatch of grid resources. Our goal is to develop a statistical framework for short-term wind power generation simulations across space and time. This project is motivated by working with a large dataset of wind generation in the Electric Reliability Council of Texas (ERCOT) region and is geared to the concrete practical concerns faced by electricity grid operators. We refer to our team's related publications [8, 7, 52, 38] that employ similar simulations for various downstream risk management tasks; other use cases are discussed, among others, in [27, 33, 35, 58].
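The paper's specific warping and kernel choices are not reproduced in this abstract; as a generic illustration of the idea behind input-warped Gaussian processes — transforming the inputs (here, hour of day mapped onto a circle so that 23:00 and 00:00 are close) before applying a standard stationary kernel, then drawing probabilistic scenarios from the GP posterior — a minimal sketch might look like the following. The synthetic diurnal data and all parameter values are hypothetical.

```python
import numpy as np

def warp_hours(t):
    # Input warping: map hour-of-day onto the unit circle so that
    # hour 23 and hour 0 are neighbors under a Euclidean kernel.
    ang = 2 * np.pi * np.asarray(t) / 24.0
    return np.column_stack([np.sin(ang), np.cos(ang)])

def rbf_kernel(X1, X2, length_scale=1.0, variance=1.0):
    # Squared-exponential kernel on the (warped) inputs.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior(X_train, y_train, X_test, noise=1e-2, **kern):
    # Standard GP regression equations via a Cholesky factorization.
    K = rbf_kernel(X_train, X_train, **kern) + noise * np.eye(len(X_train))
    Ks = rbf_kernel(X_train, X_test, **kern)
    Kss = rbf_kernel(X_test, X_test, **kern)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    cov = Kss - v.T @ v
    return mean, cov

# Hypothetical diurnal wind-power profile (capacity factor vs. hour).
rng = np.random.default_rng(0)
hours = np.arange(0.0, 24.0, 2.0)
power = (0.5 + 0.3 * np.sin(2 * np.pi * (hours - 14) / 24)
         + 0.05 * rng.standard_normal(len(hours)))

# Fit on warped inputs and predict at two new hours.
mean, cov = gp_posterior(warp_hours(hours), power, warp_hours([1.0, 13.0]))

# Probabilistic forecasting: draw joint scenarios from the posterior,
# the kind of simulation output used for downstream risk management.
scenarios = rng.multivariate_normal(mean, cov + 1e-9 * np.eye(len(mean)),
                                    size=100)
```

The warping step is what lets a stationary kernel respect the periodic geometry of the inputs; extending the warped input with spatial coordinates (turbine locations) would give the spatiotemporal version in the same framework.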


EEG-GPT: Exploring Capabilities of Large Language Models for EEG Classification and Interpretation

Kim, Jonathan W., Alaa, Ahmed, Bernardo, Danilo

arXiv.org Artificial Intelligence

Large language models (LLMs) such as ChatGPT have garnered substantial attention in the media and among the machine learning (ML) community. LLMs represent a pivotal paradigm shift in artificial intelligence (AI): they consist of transformer architectures substantially larger in scale than their predecessors, such as Recurrent Neural Networks (RNNs) and Long Short-Term Memory (LSTM) networks [1], and leverage internet-scale text corpora, thus excelling not only at text completion but also demonstrating emergent capabilities in rudimentary language reasoning [2, 3]. LLMs display several features conducive to the small-data regime present in most EEG datasets, where even the largest datasets typically contain only on the order of thousands of EEGs. Primarily, LLMs have the capability to perform few- and even zero-shot learning [4]. Recent research has investigated how LLMs can perform few-shot learning in domains ranging from cancer drug synergy prediction to cardiac signal analysis [5, 6]. Other work has demonstrated the ability of LLMs to outperform experts in annotating political Twitter messages with zero-shot learning [7]. Additionally, previous work has shown that transformer architectures are capable of in-context learning for zero-shot tasks - in other words, using information provided in the prompt to yield better performance on various tasks [8].
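EEG-GPT's actual prompting scheme is not described in this abstract; as a generic illustration of few-shot in-context learning, a prompt can be assembled from a handful of labeled examples followed by the unlabeled query, so the model infers the task from the prompt alone. All report texts and labels below are hypothetical.

```python
def build_few_shot_prompt(examples, query, label_set):
    """Assemble a few-shot classification prompt for an LLM.

    examples  : list of (report_text, label) demonstration pairs
    query     : unlabeled report text to classify
    label_set : allowed output labels
    """
    blocks = [f"Classify each EEG report as one of: {', '.join(label_set)}."]
    for text, label in examples:
        blocks.append(f"Report: {text}\nLabel: {label}")
    # The prompt ends at "Label:" so the model's completion is the prediction.
    blocks.append(f"Report: {query}\nLabel:")
    return "\n\n".join(blocks)

# Hypothetical demonstrations; with zero examples this reduces to
# zero-shot prompting (instruction plus query only).
demos = [
    ("Frequent left temporal sharp waves during drowsiness.", "abnormal"),
    ("Well-organized posterior dominant rhythm at 10 Hz.", "normal"),
]
prompt = build_few_shot_prompt(demos, "Diffuse background slowing.",
                               ["normal", "abnormal"])
```

The returned string would then be sent to an LLM completion endpoint; the number of demonstrations is the "shot" count, traded off against context-window length.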